


Supplementary Material

Neural Information Processing Systems

Here we elaborate on the details of using SNFs as a variational approximation of the posterior distribution of a variational autoencoder (VAE) [21], as presented in our last results section. All experiments were run using PyTorch 1.2 on GTX 1080 Ti cards. An NSF block consists of two subsequent NSF layers with intermediate swap layers. "Biased data" is generated by running local Metropolis MC in each of the two wells separately. "Unbiased data" is produced by running Metropolis MC with a large proposal step (standard […]); the other settings are the same as in Table 1.
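For concreteness, the two data regimes can be reproduced with a short random-walk Metropolis sampler. The sketch below is illustrative only: the double-well energy, step sizes, and chain lengths are assumptions, not the paper's exact settings.

import torch

def energy(x):
    # hypothetical 2D double-well potential with wells at x1 = -1 and x1 = +1
    return (x[..., 0] ** 2 - 1.0) ** 2 + 0.5 * x[..., 1] ** 2

def metropolis_mc(x0, n_steps, proposal_std):
    # random-walk Metropolis with isotropic Gaussian proposals
    x = x0.clone()
    samples = []
    for _ in range(n_steps):
        proposal = x + proposal_std * torch.randn_like(x)
        log_ratio = energy(x) - energy(proposal)            # log acceptance ratio
        accept = torch.rand(x.shape[0]) < log_ratio.exp().clamp(max=1.0)
        x = torch.where(accept.unsqueeze(-1), proposal, x)
        samples.append(x.clone())
    return torch.stack(samples)

# "Biased data": small steps started in each well, so chains stay local.
biased_left = metropolis_mc(torch.tensor([[-1.0, 0.0]]).repeat(100, 1), 1000, proposal_std=0.1)
biased_right = metropolis_mc(torch.tensor([[1.0, 0.0]]).repeat(100, 1), 1000, proposal_std=0.1)

# "Unbiased data": a large proposal step lets chains cross the barrier and mix.
unbiased = metropolis_mc(torch.randn(100, 2), 1000, proposal_std=1.5)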


Stochastic Normalizing Flows

Wu, Hao, Köhler, Jonas, Noé, Frank

arXiv.org Machine Learning

Normalizing flows are popular generative learning methods that train an invertible function to transform a simple prior distribution into a complicated target distribution. Here we generalize the framework by introducing Stochastic Normalizing Flows (SNFs): an arbitrary sequence of deterministic invertible functions and stochastic processes such as Markov Chain Monte Carlo (MCMC) or Langevin dynamics. This combination can be powerful, as adding stochasticity to a flow helps overcome the expressiveness limitations of a chosen deterministic invertible function, while the trainable flow transformations can improve the sampling efficiency over pure MCMC. Key to our approach is that we can match a marginal target density without having to marginalize out the stochasticity of traversed paths. Invoking ideas from nonequilibrium statistical mechanics, we introduce a training method that only uses conditional path probabilities. We can turn an SNF into a Boltzmann Generator that produces asymptotically unbiased samples from a given target density by importance sampling of these paths. We illustrate the representational power, sampling efficiency, and asymptotic correctness of SNFs on several benchmarks.
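To make the construction concrete, here is a minimal PyTorch sketch (not the authors' implementation) of one SNF sampling pass: a RealNVP-style coupling layer alternates with an overdamped Langevin step, and the log importance weight accumulates the layer's log-Jacobian together with the ratio of backward to forward Langevin transition densities, so the final samples can be reweighted to the target. The energy function, network width, number of blocks, and step size are illustrative assumptions, and training of the coupling layers is omitted.

import torch
import torch.nn as nn

def target_energy(x):
    # hypothetical 2D double-well target
    return (x[..., 0] ** 2 - 1.0) ** 2 + 0.5 * x[..., 1] ** 2

class AffineCoupling(nn.Module):
    # deterministic invertible (RealNVP-style) layer acting on the second coordinate
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 2))

    def forward(self, x):
        x1, x2 = x[..., :1], x[..., 1:]
        s, t = self.net(x1).chunk(2, dim=-1)
        y2 = x2 * torch.exp(s) + t
        return torch.cat([x1, y2], dim=-1), s.squeeze(-1)   # output and log|det J|

def grad_energy(x):
    x = x.detach().requires_grad_(True)
    return torch.autograd.grad(target_energy(x).sum(), x)[0]

def langevin_step(x, eps=0.01):
    # one overdamped Langevin step; returns new state and log q_b/q_f weight term
    mean_f = x - eps * grad_energy(x)
    y = mean_f + (2 * eps) ** 0.5 * torch.randn_like(x)
    log_qf = -((y - mean_f) ** 2).sum(-1) / (4 * eps)       # forward log-density
    mean_b = y - eps * grad_energy(y)
    log_qb = -((x - mean_b) ** 2).sum(-1) / (4 * eps)       # backward log-density
    return y, log_qb - log_qf                               # Gaussian normalizers cancel

layers = [AffineCoupling() for _ in range(5)]
x = torch.randn(512, 2)                   # sample the Gaussian prior
log_w = 0.5 * (x ** 2).sum(-1)            # + u_prior(z_0), up to a constant
for layer in layers:
    x, log_det = layer(x)
    log_w = log_w + log_det               # deterministic layer: log|det J|
    x, delta = langevin_step(x)
    log_w = log_w + delta                 # stochastic layer: log q_b - log q_f
log_w = log_w - target_energy(x)          # - u_target(z_K)
weights = torch.softmax(log_w, dim=0)     # self-normalized importance weights

Under this weighting, an expectation under the target density can be estimated as a weighted average over the batch, which is the importance-sampling reweighting that turns an SNF into a Boltzmann Generator as described above.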